Web Survey Bibliography
Over the last decades, cross-national data production in social science research has increased tremendously (Harkness 2008). The large-scale provision and widespread use of cross-national data sets constitute a huge opportunity for the research community but also pose the challenge of developing cross-nationally comparable survey items (Lynn, Japec, and Lyberg 2006). At the same time, substantive researchers are increasingly aware of the need to understand respondents’ cognitive processes when answering a survey question (Smith et al. 2011). Recently, the method of online probing has been developed, which implements probing techniques from cognitive interviewing in web surveys. In the traditional probing approach, interviewers obtain additional information by asking follow-up questions called probes (Beatty and Willis 2007). Online probing, in contrast, administers probes as open-ended questions in a web survey. It can reveal the cognitive processes of web survey participants and helps to assess whether respondents’ interpretations of an item differ across countries (Braun et al. 2015).
The implementation of probes within web surveys offers respondents a higher level of anonymity than the laboratory situation of cognitive interviewing (Behr and Braun 2015), which potentially reduces social desirability effects in the response process (Bethlehem and Biffignandi 2012). Online probing can easily realize large sample sizes, which increases the generalizability of the results, enables an evaluation of the prevalence of problems or themes, and can explain the response patterns of specific subpopulations (Braun et al. 2015). Since all probes have to be programmed in advance, all respondents receive the same probe, and the procedure is highly standardized (Braun et al. 2015). When applied to cross-national data, online probing is a powerful tool for assessing the comparability of questions. In contrast to traditional quantitative approaches to assessing the equivalence of items (e.g., measurement invariance tests), online probing can explain why respondents in certain countries might misunderstand a specific item or why they adopt different perspectives when providing a response (Behr et al. 2014a).
The overarching goal of this dissertation project is to explore the potential of the method of online probing vis-à-vis other relevant methods that share similar goals (cognitive interviewing and measurement invariance tests) and as an assessment tool for single-item indicators in cross-national surveys. In particular, the dissertation addresses the following research questions: 1) Does online probing arrive at results similar to those of other methods? 2) What are the strengths and weaknesses of online probing in comparison to other methods? 3) How can online probing be combined with other methods in a mixed-methods approach? 4) How useful is online probing for assessing the cross-national comparability of single-item indicators? Since the dissertation’s goal is to compare the methods of online probing, cognitive interviewing, and measurement invariance tests with regard to their potential to detect problematic issues at the item level, the field of national identity was chosen as the substantive application for the method comparisons because of the existence of potentially problematic measures in a cross-national context. This dissertation focuses on items from the 2013 International Social Survey Programme (ISSP) module on National Identity.
The first article of this dissertation (“Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?”; published in Field Methods) analyzed whether online probing and cognitive interviewing arrive at similar conclusions with regard to error detection and the themes mentioned by respondents when applied to the same set of items (the ISSP item battery on specific national pride). The study compares data from cognitive interviews conducted with 20 German respondents in April 2013 with a web survey conducted with 532 German respondents in September 2013. The article revealed that the two methods have complementary strengths and weaknesses. While probing answers in cognitive interviewing show indications of higher response quality, online probing can compensate through a larger sample size. The article also provides researchers with guidance on which method is preferable in a given research situation and advocates combining both methods in a mixed-methods approach.
The second article of this dissertation (“Necessary but Insufficient: Why Measurement Invariance Tests Need Online Probing as a Complementary Tool”; forthcoming in Public Opinion Quarterly; recipient of the 2016 AAPOR/WAPOR Janet A. Harkness Award and the 2016 QDET2 Monroe Sirken Innovative Paper Award for Young Scholars of Question Evaluation) provides an example of a mixed-methods approach that combines online probing with quantitative measurement invariance tests. Using the concepts of constructive patriotism and nationalism as examples, this study explains how the combination of both methods can not only reveal incomparable items and countries but also explain issues related to cross-national comparability. By analyzing data from the 2013 ISSP and a web survey with 2,685 respondents from five countries, online probing uncovered the reasons for the lack of comparability (varying lexical scope and silent misunderstanding of a key term) that was also detected in the measurement invariance tests.
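Measurement invariance tests of the kind combined with online probing above typically proceed through nested multi-group CFA models (configural, metric, scalar), each compared to the previous one with a chi-square difference test. The sketch below illustrates only that comparison step; the fit statistics are hypothetical numbers, not results from the dissertation.

```python
# Hedged sketch: likelihood-ratio (chi-square difference) test between two
# nested multi-group CFA models, as used in measurement invariance testing.
# The fit statistics below are invented for illustration only.
from scipy.stats import chi2

def chi_square_difference(chisq_restricted, df_restricted, chisq_free, df_free):
    """Compare a more restricted model (e.g., metric: equal loadings across
    countries) against a freer nested model (e.g., configural)."""
    delta_chisq = chisq_restricted - chisq_free
    delta_df = df_restricted - df_free
    p_value = chi2.sf(delta_chisq, delta_df)  # survival function = upper tail
    return delta_chisq, delta_df, p_value

# Hypothetical values: configural model chi2=498.1 (df=170),
# metric model chi2=512.4 (df=180)
d_chi, d_df, p = chi_square_difference(512.4, 180, 498.1, 170)
# A small p (e.g., < .05) would indicate that constraining the loadings
# significantly worsens fit, i.e., metric invariance would be rejected.
```

In practice the full invariance sequence is run in SEM software; this fragment only makes the nested-model logic concrete.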
Finally, the third article demonstrates the potential of online probing for assessing the cross-national comparability of single-item indicators, using the example of the general national pride item. Online probing provides a unique solution for deciding whether single-item indicators are equivalent, because the traditional approach of measurement invariance testing presupposes multiple-indicator measures and is therefore inapplicable to single-item indicators. This study analyzed 2,685 probe responses from a web survey conducted in five countries. Online probing uncovered several potentially problematic issues as well as the fact that respondents in all countries associate various concepts with the general national pride item.
Therefore, the contributions of this dissertation are:
1. The insight that online probing arrives at results similar to those of cognitive interviewing and measurement invariance tests.
2. A clear understanding of the method’s strengths and weaknesses vis-à-vis cognitive interviewing and measurement invariance tests.
3. An explanation of optimal implementations of online probing in a mixed-methods approach.
4. A demonstration of the usefulness of online probing to assess the cross-national comparability of single-item indicators.
5. An assessment of the cross-national comparability of measures of national identity for substantive researchers.